  1.
    What a Rational Parser Would Do. John T. Hale - 2011 - Cognitive Science 35 (3):399-443.
    This article examines cognitive process models of human sentence comprehension based on the idea of informed search. These models are rational in the sense that they strive to find a good syntactic analysis quickly. Informed search derives a new account of garden pathing that handles traditional counterexamples. It supports a symbolic explanation for local coherence as well as an algorithmic account of entropy reduction. The models are expressed in a broad framework for theories of human sentence comprehension.
  2.
    Quantifying Structural and Non‐structural Expectations in Relative Clause Processing. Zhong Chen & John T. Hale - 2021 - Cognitive Science 45 (1):e12927.
    Information‐theoretic complexity metrics, such as Surprisal (Hale, 2001; Levy, 2008) and Entropy Reduction (Hale, 2003), are linking hypotheses that bridge theorized expectations about sentences and observed processing difficulty in comprehension. These expectations can be viewed as syntactic derivations constrained by a grammar. However, this expectation‐based view is not limited to syntactic information alone. The present study combines structural and non‐structural information in unified models of word‐by‐word sentence processing difficulty. Using probabilistic minimalist grammars (Stabler, 1997), we extend expectation‐based models to include (...)
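The surprisal metric this abstract mentions (Hale, 2001; Levy, 2008) is simply the negative log conditional probability of each word given its left context. A minimal sketch of the idea, using an invented toy corpus and a bigram model with add-one smoothing purely for illustration (the cited work uses probabilistic grammars, not bigrams):

```python
import math
from collections import Counter

# Toy corpus (illustrative only); real models use probabilistic grammars or LMs.
corpus = "the horse raced past the barn fell".split()

vocab = set(corpus)
bigrams = Counter(zip(corpus, corpus[1:]))
unigrams = Counter(corpus)

def surprisal(prev, word):
    """Surprisal in bits: -log2 P(word | prev), with add-one smoothing."""
    p = (bigrams[(prev, word)] + 1) / (unigrams[prev] + len(vocab))
    return -math.log2(p)

# Word-by-word processing-difficulty profile for the toy sentence.
for prev, word in zip(corpus, corpus[1:]):
    print(f"{word:>6}: {surprisal(prev, word):.2f} bits")
```

Under this linking hypothesis, a word is predicted to be harder to process exactly when it is less expected: an unseen continuation (e.g. `surprisal("the", "fell")`) yields a higher value than an attested one (`surprisal("the", "horse")`).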
  3.
    Modeling Structure‐Building in the Brain With CCG Parsing and Large Language Models. Miloš Stanojević, Jonathan R. Brennan, Donald Dunagan, Mark Steedman & John T. Hale - 2023 - Cognitive Science 47 (7):e13312.
    To model behavioral and neural correlates of language comprehension in naturalistic environments, researchers have turned to broad‐coverage tools from natural‐language processing and machine learning. Where syntactic structure is explicitly modeled, prior work has relied predominantly on context‐free grammars (CFGs), yet such formalisms are not sufficiently expressive for human languages. Combinatory categorial grammars (CCGs) are sufficiently expressive, directly compositional models of grammar with flexible constituency that affords incremental interpretation. In this work, we evaluate whether a more expressive CCG provides a better (...)
  4.
    Automaton Theories of Human Sentence Comprehension. John T. Hale - 2014 - Stanford, California: CSLI Publications, Center for the Study of Language and Information.
    How could the kinds of grammars that linguists write actually be used in models of perceptual processing? This book relates grammars to cognitive architecture. It shows how incremental parsing works, step by step, and how specific learning rules might lead to frequency-sensitive preferences. Along the way, Hale reconsiders garden-pathing, the parallel/serial distinction, and information-theoretic complexity metrics such as surprisal. A "must" for cognitive scientists of language.